The ISD/SAT evaluation plan should be a main part of the Integrated Master Plan (IMP).
Develop an ISD/SAT evaluation plan. Evaluation should be integrated throughout every activity of the ISD/SAT process, and one of a manager's top priorities is developing an evaluation plan that ensures the instructional development process is of high quality. Quality is a continuous concern when developing a new instructional system or revising an existing one, and it extends through every phase of development. The evaluation plan establishes what you will evaluate, and how, during the instructional development process; it is the benchmark for quality and ensures that the ISD/SAT process results in a quality instructional system.
The #1 check in any training evaluation should be to ensure that the work requirement aligns with the learning objective, which in turn should align with the assessment strategy. If those are aligned, good. If not, you probably have a problem that needs to be fixed before continuing.
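This alignment check can be expressed as a simple traceability test: every learning objective should trace back to a work requirement and forward to at least one assessment item. The sketch below is purely illustrative; the identifiers and field names are hypothetical, not part of any NETC data standard.

```python
# Hypothetical traceability data: each learning objective records the work
# requirement it supports and the assessment items that measure it.
objectives = [
    {"id": "LO-1", "work_requirement": "WR-101", "assessed_by": ["TEST-1"]},
    {"id": "LO-2", "work_requirement": None,     "assessed_by": ["TEST-2"]},
    {"id": "LO-3", "work_requirement": "WR-102", "assessed_by": []},
]

def alignment_gaps(objectives):
    """Return (objective id, problem) pairs for misaligned objectives."""
    gaps = []
    for obj in objectives:
        if not obj["work_requirement"]:
            gaps.append((obj["id"], "no supporting work requirement"))
        if not obj["assessed_by"]:
            gaps.append((obj["id"], "no assessment item measures this objective"))
    return gaps

for obj_id, problem in alignment_gaps(objectives):
    print(f"{obj_id}: {problem}")
```

An empty result means the work requirements, objectives, and assessments are aligned; any reported gap should be resolved before development continues.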
What is in the plan? The ISD/SAT evaluation plan includes information sufficient to ensure that ISD/SAT results in quality processes and products. The plan may include, but is not limited to, the following information:
Identification of responsibilities including tasking
Scope and purpose of the evaluation
How and when the evaluation activities are to be accomplished
Documentation and report requirements
Example of an Evaluation Plan:
Evaluation Plan For (Approved Name of Training System Program)
Introduction: The introduction should provide a brief overview of the training evaluation document.
Training Evaluation Planning Data: The training evaluation planning data can include the following:
Purpose of the planned evaluation or validation.
Scope of the evaluation (e.g., learning objectives, critical standards).
Type of planned evaluation (e.g., summative, formative, training effectiveness, training capabilities, cost-effectiveness, test items, course or materials review).
Method of evaluation (e.g., empirical, analytic, internal, external).
Types of information to be collected (e.g., opinion, observation, performance).
Procedures to be used for collecting information as follows:
Criteria to select size and composition of target population sample.
Criteria for site selection.
Methods for collection of information about student target population sample participants.
Criteria for selection of instructors.
Methods for collection of information about instructor participants.
Methods to be used to prepare facilities and equipment prior to conduct of evaluation.
Methods to be used to prepare students and instructors to participate in the evaluation.
Methods for administration of the evaluation.
Methods for collection of student reactions to the training during the presentation.
Methods for observation of the presentation of training.
Methods for collection of student and instructor comments at the conclusion of training.
Methods for the recording of data.
Methods for the conduct of interviews.
Methods for participants to provide additional data following the completion of the actual evaluation.
Methods for determining validity and reliability of the evaluation.
Methods for conducting individual trials.
Methods for conducting small group trials.
Methods for conducting operational trials.
Methods for conducting test validation.
Methods for conducting content validation.
Methods for correcting training materials.
Methods for revising training materials.
Methods for conducting tests.
Methods for correcting tests.
Methods for revising tests.
Methods for validating corrected materials.
Methods for validating corrected tests.
Procedures to be used for data analysis as follows:
Criteria for assessing performance.
Criteria and procedures for validation of the evaluation.
Analytical treatment of data (e.g., statistical treatment).
Criteria and procedures for estimating criticality of deficiencies.
Criteria for accepting tests as validated.
Criteria for accepting training materials as validated.
Criteria and procedures for demonstrating impact on cost effectiveness and Return On Investment (ROI).
Procedures to be used for reporting the findings.
Procedures to be used for reporting the conclusions.
Procedures to be used for reporting the recommendations.
Procedures used to report changes required based on trials.
The data collection instruments to be used (e.g., tests, checklists, questionnaires).
Schedule for data collection and performing the evaluations and validation trials.
A description of resource requirements (e.g., personnel, materials, special equipment, travel funds, facilities) for each evaluation or validation trial.
Responsibility for testing and responsibility for conducting the evaluations and validation trials.
Roles and responsibilities of all personnel to be involved (e.g., command, students, evaluators, graduates, supervisors of graduates) in each evaluation and validation trial.
Identification of the agencies and decision authorities who will receive the report.
Listing of the proposed evaluation sites.
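Several of the methods listed above, such as determining the reliability of a test, have standard statistical treatments. One common reliability statistic is Cronbach's alpha (which reduces to KR-20 for pass/fail items); the sketch below is an illustration only, and the plan itself should name the statistics actually required for the program.

```python
# A minimal Cronbach's alpha sketch using only the standard library.
# The item scores are invented example data, not from any real evaluation.
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: one inner list per test item, aligned by student.
    Returns Cronbach's alpha, an internal-consistency reliability estimate."""
    k = len(item_scores)
    item_vars = sum(pvariance(scores) for scores in item_scores)
    totals = [sum(student) for student in zip(*item_scores)]
    total_var = pvariance(totals)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Example: 3 test items scored for 5 students (1 = correct, 0 = incorrect).
items = [
    [1, 1, 0, 1, 0],
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
]
print(round(cronbach_alpha(items), 3))
```

Higher values indicate that the test items measure the underlying objective consistently; the acceptance threshold belongs in the "criteria for accepting tests as validated" portion of the plan.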
Training Evaluation Results Data: This data provides a description of the purpose, scope, and intended use of the training evaluation results, and shall include the following:
Introduction. The introduction shall describe the following:
Method of evaluation.
Types of information collected.
Procedures and instruments used for collecting information.
Procedures for data analysis.
Background paragraph that explains history and circumstances requiring evaluation.
Background paragraph that explains history and circumstances requiring validation.
Problem paragraph that provides a statement of any problem or deficiency discovered by the evaluation.
Problem paragraph that provides a statement of any problem or deficiency discovered by the validation.
Data collection deficiencies.
Results of individual trials.
Results of small group validation trials.
Results of operational validation trials.
Results of test validation.
Changes made to training materials as a result of previous validation trials.
Results of cost effectiveness and Return On Investment (ROI) analyses.
Summary of Findings: The summary can provide a description of the data collected during the evaluation.
Conclusion and Recommendations: Conclusions and recommendations can include:
A description of whether or not the product met the established validation criteria and is acceptable for training. If the product did not meet the criteria, this data shall provide specific recommendations for correcting product deficiencies and its impact on the delivery schedule. In the case of test and training material validation trials, this data shall provide descriptions of changes made to test and training materials after each validation performed.
A description of the cost effectiveness and Return On Investment (ROI) benefits of the training.
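The ROI portion of the conclusions usually rests on the classic formula: net benefit divided by cost. A minimal sketch, with purely hypothetical cost and benefit figures:

```python
# Classic ROI formula: (benefit - cost) / cost, expressed as a percent.
def roi_percent(benefit, cost):
    """Return ROI as a percentage of cost."""
    return (benefit - cost) / cost * 100

cost = 250_000      # hypothetical development and delivery cost
benefit = 400_000   # hypothetical estimated benefit (e.g., avoided rework)
print(f"ROI = {roi_percent(benefit, cost):.0f}%")
```

The hard part in practice is not the arithmetic but defensibly estimating the benefit figure, which is why the plan should state the estimation method up front.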
Appendices: The following appendices shall be included:
Appendix A: Provide a description of resources used, to include:
The time required to conduct each evaluation or validation trial and analyze the data.
The facilities and equipment used.
A summary of the demographic data for participating students.
A summary of the staffing requirements for participating instructors and support personnel.
Criteria used to determine master versus non-master.
Appendix B: Provide a listing of the participating organizations and the evaluation responsibilities performed by each organization.
Appendix C: Shall include copies of all data collection instruments used during the evaluation.
Appendix E: Shall show the schedule of all evaluation, trial, and validation events.
Appendix F: Shall provide a summary of any literature reviews of relevant findings from previous research on this or similar training products, or addressing this or similar training deficiencies.
Appendix G: Shall provide a learning objectives paragraph which describes the specific determinations made, and which specifies the essential elements of analysis that were addressed in accomplishing each learning objective and associated test item.
On contracted efforts, NETC has added a Quality Assurance Surveillance Plan (QASP) that will evaluate each deliverable.